Recorded Step Directional Mutation for Faster Convergence
Abstract
Two meta-evolutionary optimization strategies described in this paper accelerate the convergence of evolutionary programming algorithms while still retaining much of their ability to deal with multi-modal problems. The strategies, called directional mutation and recorded step in this paper, can operate independently, but together they greatly enhance the ability of evolutionary programming algorithms to deal with fitness landscapes characterized by long narrow valleys. The directional mutation aspect of this combined method uses correlated meta-mutation but does not introduce a full covariance matrix; the new methods are thus much more economical in terms of storage for problems with high dimensionality. Additionally, directional mutation is rotationally invariant, a substantial advantage over self-adaptive methods that use a single variance per coordinate on problems whose natural orientation does not align with the coordinate axes. Step recording is a subtle variation on conventional meta-mutational algorithms that allows desirable meta-mutations to be introduced quickly. Directional mutation, on the other hand, has analogies with conjugate gradient techniques in deterministic optimization. Together, these methods substantially improve performance on certain classes of problems without incurring much cost on problems where they provide little benefit. Somewhat surprisingly, their effect when applied separately is not consistent. This paper examines the performance of these new methods on several standard problems taken from the literature and compares them directly to more conventional evolutionary algorithms. A new test problem is also introduced to highlight the difficulties inherent in long narrow valleys.
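The abstract does not reproduce the paper's exact update rules, so the following minimal Python sketch only illustrates the general idea under stated assumptions: each individual carries a self-adapted step size and a recorded direction vector (the mutation step that produced it), and offspring are generated from isotropic Gaussian noise plus a random multiple of that recorded direction. The names (mutate, evolve), the log-normal step-size adaptation with parameter tau, and the (mu + lambda) truncation selection are illustrative choices, not the authors' implementation; the Rosenbrock function stands in for a long-narrow-valley landscape.

    import numpy as np

    rng = np.random.default_rng(0)

    def rosenbrock(x):
        # Long, narrow, curved valley; minimum at x = (1, ..., 1).
        return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

    def mutate(parent, sigma, direction, tau=0.5):
        # One offspring: isotropic Gaussian noise plus a random component along
        # the recorded direction, with log-normal self-adaptation of sigma.
        n = parent.size
        child_sigma = sigma * np.exp(tau * rng.standard_normal())
        step = child_sigma * rng.standard_normal(n) + rng.standard_normal() * direction
        child = parent + step
        return child, child_sigma, step  # the step is recorded with the child

    def evolve(f, n=10, pop=30, generations=500):
        # Each individual: (solution, step size, recorded direction).
        individuals = [(rng.uniform(-2, 2, n), 0.5, np.zeros(n)) for _ in range(pop)]
        for _ in range(generations):
            offspring = []
            for x, sigma, d in individuals:
                child, child_sigma, step = mutate(x, sigma, d)
                offspring.append((child, child_sigma, step))
            merged = individuals + offspring
            merged.sort(key=lambda ind: f(ind[0]))
            individuals = merged[:pop]  # (mu + lambda) truncation selection
        return individuals[0]

    best_x, best_sigma, best_dir = evolve(rosenbrock)
    print(rosenbrock(best_x))

Because the recorded direction is a full n-vector rather than an n-by-n covariance matrix, storage per individual stays linear in the dimensionality, and because the direction is carried and mutated as a vector rather than as per-coordinate variances, the mutation distribution is not tied to the coordinate axes.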